Maximal Codeword Lengths in Huffman Codes

Author

  • Y. S. Abu-Mostafa
Abstract

In this article, the authors consider the following question about Huffman coding, which is an important technique for compressing data from a discrete source. If p is the smallest source probability, how long, in terms of p, can the longest Huffman codeword be? It is shown that if p is in the range 0 < p ≤ 1/2, and if K is the unique index such that 1/F_{K+3} < p ≤ 1/F_{K+2}, where F_K denotes the Kth Fibonacci number, then the longest Huffman codeword for a source whose least probability is p is at most K, and no better bound is possible. Asymptotically, this implies the surprising fact that for small values of p, a Huffman code's longest codeword can be as much as 44 percent larger than that of the corresponding Shannon code.
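The stated bound is easy to evaluate numerically. The following is a minimal Python sketch (not taken from the paper) that, for a given smallest probability p, finds the index K with 1/F_{K+3} < p ≤ 1/F_{K+2} and compares it with the length ceil(log2(1/p)) that a Shannon code assigns to a symbol of probability p. Since F_K grows like φ^K/√5, K is roughly log2(1/p)/log2(φ) ≈ 1.44·log2(1/p) for small p, which is where the 44 percent figure comes from. The function names are illustrative only.

import math

def huffman_length_bound(p):
    # Bound from the abstract above: with F_1 = F_2 = 1, F_3 = 2, ..., find the
    # unique K such that 1/F_{K+3} < p <= 1/F_{K+2}; the longest codeword in any
    # Huffman code whose smallest probability is p is then at most K.
    assert 0 < p <= 0.5
    fib = [1, 1]                      # F_1, F_2
    while fib[-1] <= 1 / p:           # grow until F_m exceeds 1/p
        fib.append(fib[-1] + fib[-2])
    m = len(fib) - 1                  # largest (1-based) index with F_m <= 1/p
    return m - 2                      # i.e. K, since F_{K+2} <= 1/p < F_{K+3}

def shannon_length(p):
    # Length a Shannon code assigns to a symbol of probability p.
    return math.ceil(math.log2(1 / p))

for p in (0.5, 0.2, 1e-3, 1e-6):
    K, s = huffman_length_bound(p), shannon_length(p)
    print(f"p = {p:g}: Huffman bound K = {K}, Shannon length = {s}, ratio = {K / s:.2f}")

For p = 1e-6 this prints K = 28 against a Shannon length of 20, a ratio of 1.40, consistent with the asymptotic factor of about 1.44 noted in the abstract.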


Similar resources

Optimal Prefix Codes with Fewer Distinct Codeword Lengths are Faster to Construct

A new method for constructing minimum-redundancy prefix codes is described. This method does not explicitly build a Huffman tree; instead it uses a property of optimal codes to find the codeword length of each weight. The running time of the algorithm is shown to be O(nk), which is asymptotically faster than Huffman’s algorithm when k = o(log n), where n is the number of weights and k is the nu...


Minimum Redundancy Coding for Uncertain Sources

Consider the set of source distributions within a fixed maximum relative entropy with respect to a given nominal distribution. Lossless source coding over this relative entropy ball can be approached in more than one way. A problem previously considered is finding a minimax average length source code. The minimizing players are the codeword lengths — real numbers for arithmetic codes, integers ...


The Rényi redundancy of generalized Huffman codes

If optimality is measured by average codeword length, Huffman's algorithm gives optimal codes, and the redundancy can be measured as the difference between the average codeword length and Shannon's entropy. If the objective function is replaced by an exponentially weighted average, then a simple modification of Huffman's algorithm gives optimal codes. The redundancy can now be measured as the d...
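To make the "simple modification" concrete, here is a hedged Python sketch. It uses the exponential-cost merge rule from the generalized-Huffman literature, in which the two smallest weights w_a and w_b are replaced by 2^t·(w_a + w_b) rather than w_a + w_b; that rule is an assumption drawn from the standard literature on exponentially weighted average codeword length, not a detail quoted from the abstract above, and the function name is illustrative.

import heapq
import itertools

def exp_huffman_lengths(probs, t):
    # Huffman-like construction aimed at the exponentially weighted average
    # length (1/t) * log2(sum_i p_i * 2**(t * l_i)), t > 0.
    # Assumed merge rule: replace the two smallest weights wa, wb by 2**t * (wa + wb).
    tie = itertools.count()                 # tie-breaker so groups are never compared
    heap = [(p, next(tie), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        wa, _, ga = heapq.heappop(heap)
        wb, _, gb = heapq.heappop(heap)
        for i in ga + gb:                   # every symbol in the merged subtrees goes one level deeper
            lengths[i] += 1
        heapq.heappush(heap, (2 ** t * (wa + wb), next(tie), ga + gb))
    return lengths

print(exp_huffman_lengths([0.4, 0.3, 0.2, 0.1], t=0.5))   # -> [2, 2, 2, 2]

As t → 0 the merge rule reduces to the ordinary Huffman merge; in the example above, the exponential penalty on long codewords makes the balanced tree [2, 2, 2, 2] optimal, whereas ordinary Huffman coding would give [1, 2, 3, 3].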


Distribution-Sensitive Construction of Minimum-Redundancy Prefix Codes

A new method for constructing minimum-redundancy prefix codes is described. This method does not build a Huffman tree; instead it uses a property of optimal codes to find the codeword length of each weight. The running time of the algorithm is shown to be O(nk), where n is the number of weights and k is the number of different codeword lengths. When the given sequence of weights is already sort...
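The abstract does not spell out the O(nk) procedure, so the sketch below is not that algorithm; it is the classic two-queue construction, shown only as a point of reference for how codeword lengths can be obtained in linear time, without a heap, once the weights are sorted in nondecreasing order. All names are illustrative.

def two_queue_code_lengths(sorted_weights):
    # Codeword lengths for weights in nondecreasing order via the classic
    # two-queue Huffman merge (no heap, only parent pointers).
    n = len(sorted_weights)
    if n < 2:
        return [0] * n
    leaves = list(sorted_weights)   # queue 1: unmerged leaf weights
    internal = []                   # queue 2: (weight, node id) of merged nodes
    parent = {}                     # child node id -> parent node id
    li = ii = 0                     # queue heads
    next_id = n                     # leaves are nodes 0..n-1; internals n, n+1, ...

    def pop_smallest():
        nonlocal li, ii
        # Take the smaller of the two queue fronts.
        if ii == len(internal) or (li < n and leaves[li] <= internal[ii][0]):
            w, node = leaves[li], li
            li += 1
        else:
            w, node = internal[ii]
            ii += 1
        return w, node

    while (n - li) + (len(internal) - ii) > 1:
        wa, a = pop_smallest()
        wb, b = pop_smallest()
        parent[a] = parent[b] = next_id
        internal.append((wa + wb, next_id))
        next_id += 1

    # Parents always have larger ids than their children, so one downward
    # sweep yields every node's depth; leaf depths are the codeword lengths.
    root = next_id - 1
    depth = [0] * next_id
    for node in range(root - 1, -1, -1):
        depth[node] = depth[parent[node]] + 1
    return depth[:n]

print(two_queue_code_lengths([1, 1, 2, 4]))   # -> [3, 3, 2, 1]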


Optimum Probability Distribution for Minimum

In the present communication, we have obtained the optimum probability distribution with which the messages should be delivered so that the average redundancy of the source is minimized. Here, we have taken the case of various generalized mean codeword lengths. Moreover, the upper bound to these codeword lengths has been found for the case of Huffman encoding.




Publication date: 2000